Block Transform Adaptation by Stochastic Gradient Descent
Authors
Abstract
The problem of computing the eigendecomposition of an N × N symmetric matrix is cast as an unconstrained minimization of either of two performance measures. The K = N(N-1)/2 independent parameters represent angles of distinct Givens rotations. Gradient descent is applied to the minimization problem, step size bounds for local convergence are given, and similarities to LMS adaptive filtering are noted. In adaptive transform coding it is often desirable for the transform to approximate a local Karhunen-Loève Transform for the source. Determining such a transform is equivalent to finding the eigenvectors of the correlation matrix of the source; thus, the eigendecomposition methods developed here are applicable to adaptive transform coding.
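The recipe in the abstract can be sketched concretely: parameterize an orthogonal transform T by the K = N(N-1)/2 Givens angles and descend a performance measure such as the off-diagonal energy of T^T R T. The following is a minimal NumPy sketch, assuming a finite-difference gradient in place of the paper's analytic one; the function names, step size mu, and iteration count are illustrative and not taken from the paper.

```python
import numpy as np

def givens(n, i, j, theta):
    """N x N Givens rotation in the (i, j) coordinate plane."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c; G[j, j] = c
    G[i, j] = -s; G[j, i] = s
    return G

def transform(thetas, pairs, n):
    """Orthogonal T built as a product of K = N(N-1)/2 distinct Givens rotations."""
    T = np.eye(n)
    for theta, (i, j) in zip(thetas, pairs):
        T = T @ givens(n, i, j, theta)
    return T

def off_diag_energy(thetas, R, pairs):
    """One possible performance measure: off-diagonal energy of T^T R T,
    which is zero exactly when T diagonalizes R."""
    T = transform(thetas, pairs, R.shape[0])
    A = T.T @ R @ T
    return np.sum(A**2) - np.sum(np.diag(A)**2)

def descend(R, mu=0.005, steps=4000, eps=1e-6):
    """Plain gradient descent over the Givens angles. The finite-difference
    gradient stands in for the paper's analytic gradient; mu is simply chosen
    small rather than taken from the paper's local step-size bounds."""
    n = R.shape[0]
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    thetas = np.zeros(len(pairs))
    for _ in range(steps):
        grad = np.empty_like(thetas)
        for k in range(len(thetas)):
            d = np.zeros_like(thetas)
            d[k] = eps
            grad[k] = (off_diag_energy(thetas + d, R, pairs)
                       - off_diag_energy(thetas - d, R, pairs)) / (2 * eps)
        thetas -= mu * grad
    return transform(thetas, pairs, n)

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
R = B @ B.T                         # symmetric, correlation-like matrix
T = descend(R)
print(np.round(T.T @ R @ T, 3))     # approximately diagonal: the eigenvalues
```

Because T is always an exact product of rotations, it stays orthogonal at every step; this is the appeal of descending on the K angles rather than on the N^2 matrix entries directly.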
Similar resources
Adaptive Transform Coding Using LMS-like Principal Component Tracking
A new set of algorithms for transform adaptation in adaptive transform coding is presented. These algorithms are inspired by standard techniques in adaptive finite impulse response (FIR) Wiener filtering and demonstrate that similar algorithms with simple updates exist for tracking principal components (eigenvectors of a correlation matrix). For coding an N-dimensional source, the transform adapta...
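For a flavor of what an LMS-like principal component update can look like, here is a hedged sketch using Oja's rule, a classic stochastic tracker of the top eigenvector; Goyal & Vetterli's algorithms adapt a full transform and their updates may differ in detail. The function name and step size mu are assumptions.

```python
import numpy as np

def oja_track(samples, mu=0.01, seed=0):
    """LMS-like tracking of the principal eigenvector of the source's
    correlation matrix via Oja's rule (an illustration only; not the
    paper's algorithm)."""
    samples = np.asarray(samples)
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(samples.shape[1])
    w /= np.linalg.norm(w)
    for x in samples:
        y = w @ x                   # filter output, as in LMS
        w += mu * y * (x - y * w)   # data-driven correction; keeps ||w|| near 1
    return w

# Example: a correlated 2-D Gaussian source
rng = np.random.default_rng(1)
C = np.array([[2.0, 0.9], [0.9, 1.0]])
xs = rng.multivariate_normal([0.0, 0.0], C, size=20000)
print(oja_track(xs))  # approaches +/- the top eigenvector of C
```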
Goyal & Vetterli: Adaptive Transform Coding Using LMS-like Principal Component Tracking
A new set of algorithms for transform adaptation in adaptive transform coding is presented. These algorithms are inspired by standard techniques in adaptive finite impulse response (FIR) Wiener filtering and demonstrate that similar algorithms with simple updates exist for tracking principal components (eigenvectors of a correlation matrix). For coding an N-dimensional source, the transform adaptat...
Fast incremental adaptation using maximum likelihood regression and stochastic gradient descent
Adaptation to a new speaker or environment is becoming very important as speech recognition systems are deployed in unpredictable real-world situations. Constrained or feature-space Maximum Likelihood Linear Regression (fMLLR) [1] has proved to be especially effective for this purpose, particularly when used for incremental unsupervised adaptation [2]. Unfortunately, the standard implementation describ...
Randomized Block Coordinate Descent for Online and Stochastic Optimization
Two types of low cost-per-iteration gradient descent methods have been extensively studied in parallel. One is online or stochastic gradient descent (OGD/SGD), and the other is randomized block coordinate descent (RBCD). In this paper, we combine the two types of methods and propose online randomized block coordinate descent (ORBCD). At each iteration, ORBCD only computes the partial gradie...
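As a rough illustration of the ORBCD idea (not the paper's exact algorithm), the sketch below assumes a least-squares objective for concreteness: each iteration draws one sample, as in SGD, and one coordinate block, as in RBCD, and updates only that block with the partial stochastic gradient. All names and constants here are illustrative.

```python
import numpy as np

def orbcd_lstsq(X, y, n_blocks=4, mu=0.1, iters=5000, seed=0):
    """ORBCD-style updates for least squares: per iteration, one random
    sample and one random coordinate block."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    blocks = np.array_split(np.arange(d), n_blocks)
    for _ in range(iters):
        i = rng.integers(n)                  # random sample (the SGD part)
        b = blocks[rng.integers(n_blocks)]   # random block (the RBCD part)
        r = X[i] @ w - y[i]                  # residual on the sampled point
        w[b] -= mu * r * X[i, b]             # partial stochastic gradient step
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 8))
w_true = rng.standard_normal(8)
y = X @ w_true
print(np.round(orbcd_lstsq(X, y) - w_true, 2))  # near zero after convergence
```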
Stochastic gradient adaptation of front-end parameters
This paper examines how any parameter in the typical front end of a speech recognizer can be rapidly and inexpensively adapted with usage. It focuses first on demonstrating that effective adaptation can be accomplished using low CPU/memory-cost stochastic gradient descent methods, and second on showing that adaptation can be done at time scales small enough to make it effective with just a singl...
Journal:
Volume/Issue:
Pages: -
Publication date: 1998